Learning Generative ConvNet with Continuous Latent Factors by Alternating Back-Propagation

Authors

  • Tian Han
  • Yang Lu
  • Song-Chun Zhu
  • Ying Nian Wu
Abstract

The supervised learning of the discriminative convolutional neural network (ConvNet or CNN) is powered by back-propagation on the parameters. In this paper, we show that the unsupervised learning of a popular top-down generative ConvNet model with continuous latent factors can be accomplished by a learning algorithm that alternately performs back-propagation on the latent factors and on the parameters. The model is a non-linear generalization of factor analysis, in which the high-dimensional observed data vector, such as an image, is assumed to be a noisy version of a vector generated by a non-linear transformation of a low-dimensional vector of continuous latent factors. The latent factors are assumed to follow known independent distributions, such as the standard normal distribution, and the non-linear transformation is parametrized by a top-down ConvNet, which is capable of approximating the highly non-linear mapping from the latent factors to the image. We explore a simple and natural learning algorithm for this model that alternates between the following two steps: (1) inferring the latent factors by Langevin dynamics or gradient descent, and (2) updating the parameters of the ConvNet by gradient descent. Step (1) is based on the gradient of the reconstruction error with respect to the latent factors, which is readily available by back-propagation; we call this step inferential back-propagation. Step (2) is based on the gradient of the reconstruction error with respect to the parameters, which is also obtained by back-propagation; we call this step learning back-propagation. The code for inferential back-propagation in (1) is part of the code for learning back-propagation in (2), so inferential back-propagation is a by-product of learning back-propagation. We show that this alternating back-propagation algorithm can learn realistic generative models of natural images and sounds.
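The two alternating steps are concrete enough to sketch in code. The following PyTorch fragment is a minimal illustration of the idea, not the authors' released implementation; the generator architecture, latent dimension, Langevin step size delta, noise level sigma, and use of Adam are all illustrative assumptions.

```python
import torch
import torch.nn as nn

class Generator(nn.Module):
    """Top-down ConvNet g(z; theta) mapping a d-dim latent vector to an image.
    The architecture below is an illustrative assumption."""
    def __init__(self, d=100):
        super().__init__()
        self.net = nn.Sequential(
            nn.ConvTranspose2d(d, 128, 4, 1, 0), nn.ReLU(),   # 1x1 -> 4x4
            nn.ConvTranspose2d(128, 64, 4, 2, 1), nn.ReLU(),  # 4x4 -> 8x8
            nn.ConvTranspose2d(64, 1, 4, 2, 1), nn.Tanh(),    # 8x8 -> 16x16
        )

    def forward(self, z):
        return self.net(z.view(z.size(0), -1, 1, 1))

def alternating_backprop(x, g, d=100, n_iters=100, K=20,
                         delta=0.1, sigma=0.3, lr=1e-3):
    """Alternate (1) Langevin inference of z and (2) gradient updates of theta."""
    z = torch.randn(x.size(0), d, requires_grad=True)  # one latent vector per image
    opt = torch.optim.Adam(g.parameters(), lr=lr)      # optimizer choice is an assumption
    for _ in range(n_iters):
        # Step (1): inferential back-propagation. Run K Langevin steps on z;
        # the gradient of the negative log posterior is the back-propagated
        # reconstruction-error gradient plus the Gaussian prior term.
        for _ in range(K):
            U = ((x - g(z)) ** 2).sum() / (2 * sigma ** 2) + (z ** 2).sum() / 2
            grad_z, = torch.autograd.grad(U, z)
            with torch.no_grad():
                z += -(delta ** 2 / 2) * grad_z + delta * torch.randn_like(z)
        # Step (2): learning back-propagation. Gradient step on theta given
        # the inferred z, again via back-propagation of the reconstruction error.
        opt.zero_grad()
        loss = ((x - g(z.detach())) ** 2).sum() / (2 * sigma ** 2)
        loss.backward()
        opt.step()
    return z
```

For example, g = Generator(); z = alternating_backprop(x, g) would fit the generator to a batch x of 16x16 grayscale images and return the inferred latent factors. Dropping the noise term in the Langevin update recovers plain gradient-descent inference of the latent factors, the second option mentioned in step (1).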


Related articles

Alternating Back-Propagation for Generator Network

This paper proposes an alternating back-propagation algorithm for learning the generator network model. The model is a nonlinear generalization of factor analysis. In this model, the mapping from the continuous latent factors to the observed signal is parametrized by a convolutional neural network. The alternating back-propagation algorithm iterates the following two steps: (1) Inferential back...


Exploring Generative Perspective of Convolutional Neural Networks by Learning Random Field Models

This paper is a case study of the convolutional neural network (ConvNet or CNN) from a statistical modeling perspective. The ConvNet has proven to be a very successful discriminative learning machine. In this paper, we explore the generative perspective of the ConvNet. We propose to learn Markov random field models called FRAME (Filters, Random field, And Maximum Entropy) models using the highl...


Cooperative Training of Descriptor and Generator Networks

This paper studies the cooperative training of two probabilistic models of signals such as images. Both models are parametrized by convolutional neural networks (ConvNets). The first network is a descriptor network, which is an exponential family model or an energy-based model, whose feature statistics or energy function are defined by a bottom-up ConvNet, which maps the observed signal to the ...


A Theory of Generative ConvNet

The convolutional neural network (ConvNet or CNN) is a powerful discriminative learning machine. In this paper, we show that a generative random field model that we call generative ConvNet can be derived from the discriminative ConvNet. The probability distribution of the generative ConvNet model is in the form of exponential tilting of a reference distribution. Assuming ReLU non-linearity and...
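Schematically, the exponential tilting referred to here takes the form

```latex
p(I; w) = \frac{1}{Z(w)} \, \exp\{ f(I; w) \} \, q(I)
```

where f(I; w) is the scoring function defined by the bottom-up ConvNet, q(I) is the reference distribution (e.g., Gaussian white noise), and Z(w) is the normalizing constant.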


Stochastic Back-propagation and Variational Inference in Deep Latent Gaussian Models

We marry ideas from deep neural networks and approximate Bayesian inference to derive a generalised class of deep, directed generative models, endowed with a new algorithm for scalable inference and learning. Our algorithm introduces a recognition model to represent approximate posterior distributions, which acts as a stochastic encoder of the data. We develop stochastic backpropagation – ru...

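As a minimal illustration of the stochastic back-propagation idea (a sketch of the reparameterization trick, not this paper's implementation), gradients can flow through a Gaussian sample by writing it as a deterministic function of the parameters and independent noise:

```python
import torch

# Reparameterize a Gaussian latent variable: z = mu + sigma * eps with
# eps ~ N(0, I), so the sample is a differentiable function of the
# parameters (mu, log_sigma) and back-propagation applies directly.
mu = torch.zeros(5, requires_grad=True)
log_sigma = torch.zeros(5, requires_grad=True)

eps = torch.randn(5)                 # noise drawn independently of the parameters
z = mu + torch.exp(log_sigma) * eps  # reparameterized sample
loss = (z ** 2).sum()                # stand-in for any downstream objective
loss.backward()                      # gradients reach mu and log_sigma
print(mu.grad, log_sigma.grad)
```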


Journal:
  • CoRR

Volume: abs/1606.08571

Pages: -

Publication date: 2016